The dataset pertains to a fictitious bank with a growing customer base. The bank wants to expand its loan business and earn interest on the loans. In particular, management seeks to target its liability customers (depositors) with offers to purchase personal loans. The bank ran a campaign last year, and the liability customers converted at a healthy rate of over 9%. To reduce campaign costs, the marketing department wants to target liability customers on a minimal budget, and it is interested in how a machine learning model can better predict which customers have a higher probability of purchasing the loan.
The objective of this project is to predict whether customers will purchase the personal loan and to determine which machine learning algorithm provides the best accuracy.
My motivation for pursuing this project is to learn how machine learning algorithms can be useful in financial institutions.
The data source used in this project is Kaggle. The dataset is available at https://www.kaggle.com/itsmesunil/bank-loan-modelling.
# Import all of the essential libraries.
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import plotly.express as px
import plotly.graph_objects as go
import warnings
from IPython.display import Image
from six import StringIO
from sklearn.tree import export_graphviz
import pydotplus
from sklearn import tree
from plotly import __version__
import cufflinks as cf
from plotly.offline import download_plotlyjs, init_notebook_mode, plot, iplot
init_notebook_mode(connected = True)
cf.go_offline()
# Create a data frame called credit.
credit = pd.read_excel('Bank.xlsx', sheet_name= 'Data')
# To get the first 10 rows, use the head() function.
credit.head(10)
| ID | Age | Experience | Income | ZIP Code | Family | CCAvg | Education | Mortgage | Personal Loan | Securities Account | CD Account | Online | CreditCard | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 1 | 25 | 1 | 49 | 91107 | 4 | 1.6 | 1 | 0 | 0 | 1 | 0 | 0 | 0 |
| 1 | 2 | 45 | 19 | 34 | 90089 | 3 | 1.5 | 1 | 0 | 0 | 1 | 0 | 0 | 0 |
| 2 | 3 | 39 | 15 | 11 | 94720 | 1 | 1.0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 4 | 35 | 9 | 100 | 94112 | 1 | 2.7 | 2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4 | 5 | 35 | 8 | 45 | 91330 | 4 | 1.0 | 2 | 0 | 0 | 0 | 0 | 0 | 1 |
| 5 | 6 | 37 | 13 | 29 | 92121 | 4 | 0.4 | 2 | 155 | 0 | 0 | 0 | 1 | 0 |
| 6 | 7 | 53 | 27 | 72 | 91711 | 2 | 1.5 | 2 | 0 | 0 | 0 | 0 | 1 | 0 |
| 7 | 8 | 50 | 24 | 22 | 93943 | 1 | 0.3 | 3 | 0 | 0 | 0 | 0 | 0 | 1 |
| 8 | 9 | 35 | 10 | 81 | 90089 | 3 | 0.6 | 2 | 104 | 0 | 0 | 0 | 1 | 0 |
| 9 | 10 | 34 | 9 | 180 | 93023 | 1 | 8.9 | 3 | 0 | 1 | 0 | 0 | 0 | 0 |
# To explore the variables in the data set, use the info() function.
credit.info()
<class 'pandas.core.frame.DataFrame'> RangeIndex: 5000 entries, 0 to 4999 Data columns (total 14 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 ID 5000 non-null int64 1 Age 5000 non-null int64 2 Experience 5000 non-null int64 3 Income 5000 non-null int64 4 ZIP Code 5000 non-null int64 5 Family 5000 non-null int64 6 CCAvg 5000 non-null float64 7 Education 5000 non-null int64 8 Mortgage 5000 non-null int64 9 Personal Loan 5000 non-null int64 10 Securities Account 5000 non-null int64 11 CD Account 5000 non-null int64 12 Online 5000 non-null int64 13 CreditCard 5000 non-null int64 dtypes: float64(1), int64(13) memory usage: 547.0 KB
# To examine the mean, standard deviation, min, 25th percentile, 50th percentile, 75th percentile,
# and max values, use the describe and transpose () functions.
credit.describe().transpose()
| count | mean | std | min | 25% | 50% | 75% | max | |
|---|---|---|---|---|---|---|---|---|
| ID | 5000.0 | 2500.500000 | 1443.520003 | 1.0 | 1250.75 | 2500.5 | 3750.25 | 5000.0 |
| Age | 5000.0 | 45.338400 | 11.463166 | 23.0 | 35.00 | 45.0 | 55.00 | 67.0 |
| Experience | 5000.0 | 20.104600 | 11.467954 | -3.0 | 10.00 | 20.0 | 30.00 | 43.0 |
| Income | 5000.0 | 73.774200 | 46.033729 | 8.0 | 39.00 | 64.0 | 98.00 | 224.0 |
| ZIP Code | 5000.0 | 93152.503000 | 2121.852197 | 9307.0 | 91911.00 | 93437.0 | 94608.00 | 96651.0 |
| Family | 5000.0 | 2.396400 | 1.147663 | 1.0 | 1.00 | 2.0 | 3.00 | 4.0 |
| CCAvg | 5000.0 | 1.937913 | 1.747666 | 0.0 | 0.70 | 1.5 | 2.50 | 10.0 |
| Education | 5000.0 | 1.881000 | 0.839869 | 1.0 | 1.00 | 2.0 | 3.00 | 3.0 |
| Mortgage | 5000.0 | 56.498800 | 101.713802 | 0.0 | 0.00 | 0.0 | 101.00 | 635.0 |
| Personal Loan | 5000.0 | 0.096000 | 0.294621 | 0.0 | 0.00 | 0.0 | 0.00 | 1.0 |
| Securities Account | 5000.0 | 0.104400 | 0.305809 | 0.0 | 0.00 | 0.0 | 0.00 | 1.0 |
| CD Account | 5000.0 | 0.060400 | 0.238250 | 0.0 | 0.00 | 0.0 | 0.00 | 1.0 |
| Online | 5000.0 | 0.596800 | 0.490589 | 0.0 | 0.00 | 1.0 | 1.00 | 1.0 |
| CreditCard | 5000.0 | 0.294000 | 0.455637 | 0.0 | 0.00 | 0.0 | 1.00 | 1.0 |
# View a dataset's dimensions.
credit.shape
(5000, 14)
Feature engineering was performed to detect outliers, NA values, and null values in the dataset. Data tidying was performed to get better performance from the models.
# Drop non-essential features like ID and ZIP Code from the dataset using the drop() function.
credit1 = credit.drop(['ID','ZIP Code'],axis = 1)
# To extract the first 5 rows of the credit1 dataset, use the head() function.
credit1.head()
| Age | Experience | Income | Family | CCAvg | Education | Mortgage | Personal Loan | Securities Account | CD Account | Online | CreditCard | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 25 | 1 | 49 | 4 | 1.6 | 1 | 0 | 0 | 1 | 0 | 0 | 0 |
| 1 | 45 | 19 | 34 | 3 | 1.5 | 1 | 0 | 0 | 1 | 0 | 0 | 0 |
| 2 | 39 | 15 | 11 | 1 | 1.0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 35 | 9 | 100 | 1 | 2.7 | 2 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4 | 35 | 8 | 45 | 4 | 1.0 | 2 | 0 | 0 | 0 | 0 | 0 | 1 |
After dropping the unnecessary features ID and ZIP Code, the remaining features are shown in the table above.
# View the dataset's new dimension.
credit1.shape
(5000, 12)
The new dimension of the dataset is 5000 × 12.
# Look for null values in the data set. Display the null values on the heatmap if any.
plt.figure(figsize = (8,7))
sns.heatmap(credit1.isnull(),yticklabels = False, cbar = False, cmap = 'viridis')
plt.title('Heatmap')
credit1.isnull().sum()
Age 0 Experience 0 Income 0 Family 0 CCAvg 0 Education 0 Mortgage 0 Personal Loan 0 Securities Account 0 CD Account 0 Online 0 CreditCard 0 dtype: int64
There are no null values in the dataset, as can be seen in heatmap above.
# Look for na values in the data set. Display the na values on the heatmap if any.
plt.figure(figsize = (8,7))
sns.heatmap(credit1.isna(),yticklabels = False, cbar = False, cmap = 'viridis')
plt.title('Heatmap')
credit1.isna().sum()
Age 0 Experience 0 Income 0 Family 0 CCAvg 0 Education 0 Mortgage 0 Personal Loan 0 Securities Account 0 CD Account 0 Online 0 CreditCard 0 dtype: int64
There are no na values in the dataset, as evidenced by the heatmap above.
# To discover outliers and exclude them from the original dataset,
# use an interquartile range (IQR) approach.
# Extract credit1's continuous features: Income, CCAvg, Mortgage, Age, and Experience.
cols = credit1[["Income", "CCAvg","Mortgage","Age","Experience"]]
# Detect outliers using an interquartile range (IQR) approach.
Q1 = np.percentile(cols,25, interpolation= 'midpoint')
Q3 = np.percentile(cols,75, interpolation= 'midpoint')
# Create an IQR variable that takes the difference between the third and first quartiles.
IQR = Q3 - Q1
# Print the original dataset shape.
print("Original Credit dataset shape:",credit1.shape)
# Define a variable called Upper bound.
Upper = cols >= (Q3 + 1.5 * IQR)
# Define a variable called Lower bound.
Lower = cols <= (Q1 - 1.5 * IQR)
# Exclude the potential outliers from the dataset. Give a name credit2 to the dataframe.
credit2 = credit1[~(Upper | Lower).any(axis=1)]
# The tilde (~) operator negates the combined outlier mask, keeping only non-outlier rows.
print("New Credit dataset shape:", credit2.shape)
Original Credit dataset shape: (5000, 12) New Credit dataset shape: (3477, 12)
After removing outliers from the original dataset, 3477 observations and 12 features remain.
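Note that `np.percentile` without an `axis` argument pools all five columns into one flattened array before computing the quartiles. A per-column variant, which fences each feature on its own quartiles, is often preferable and can be sketched as follows (toy values, not the actual credit data):

```python
import pandas as pd

# Toy frame standing in for the continuous features (illustrative values only).
df = pd.DataFrame({
    "Income": [20, 30, 40, 50, 500],   # 500 is an obvious outlier
    "CCAvg":  [1.0, 1.2, 1.5, 1.8, 2.0],
})

# Per-column quartiles: DataFrame.quantile returns one value per feature.
Q1 = df.quantile(0.25)
Q3 = df.quantile(0.75)
IQR = Q3 - Q1

# Keep rows where every feature lies inside its own column's fences.
mask = ~((df < (Q1 - 1.5 * IQR)) | (df > (Q3 + 1.5 * IQR))).any(axis=1)
filtered = df[mask]
print(filtered.shape)  # (4, 2): the outlier row is dropped
```

Because the fences are computed per column, a moderate Income no longer masks an extreme CCAvg (or vice versa), at the cost of possibly dropping more rows.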
# To ignore the warning, use the filterwarnings() function.
warnings.filterwarnings('ignore')
# Visualize the distribution in the subplots to see the before and
# after picture of the Income variable in particular.
plt.figure(figsize = (16,8))
plt.subplot(2,2,1)
sns.distplot(credit1['Income'])
plt.title('Distribution plot of Income Variable')
plt.subplot(2,2,2)
sns.boxplot(credit1['Income'])
plt.title('Box plot of Income Variable')
plt.subplot(2,2,3)
sns.distplot(credit2['Income'])
plt.title('Distribution plot of Income Variable')
plt.subplot(2,2,4)
sns.boxplot(credit2['Income'])
plt.title('Box plot of Income Variable')
plt.tight_layout()
The distribution and box plots of the Income variable have improved significantly, which can help improve the performance of the machine learning models.
# To display statistical values, use the describe and transpose functions.
credit2.describe().transpose()
| count | mean | std | min | 25% | 50% | 75% | max | |
|---|---|---|---|---|---|---|---|---|
| Age | 3477.0 | 45.509060 | 11.454379 | 23.0 | 35.0 | 46.0 | 55.0 | 67.0 |
| Experience | 3477.0 | 20.226632 | 11.479098 | -3.0 | 10.0 | 20.0 | 30.0 | 43.0 |
| Income | 3477.0 | 56.237274 | 29.936561 | 8.0 | 32.0 | 52.0 | 80.0 | 125.0 |
| Family | 3477.0 | 2.475410 | 1.163695 | 1.0 | 1.0 | 2.0 | 4.0 | 4.0 |
| CCAvg | 3477.0 | 1.484920 | 1.172683 | 0.0 | 0.5 | 1.3 | 2.1 | 8.0 |
| Education | 3477.0 | 1.947944 | 0.829126 | 1.0 | 1.0 | 2.0 | 3.0 | 3.0 |
| Mortgage | 3477.0 | 15.720161 | 36.964702 | 0.0 | 0.0 | 0.0 | 0.0 | 127.0 |
| Personal Loan | 3477.0 | 0.031924 | 0.175823 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
| Securities Account | 3477.0 | 0.106414 | 0.308411 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
| CD Account | 3477.0 | 0.038539 | 0.192521 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 |
| Online | 3477.0 | 0.599080 | 0.490155 | 0.0 | 0.0 | 1.0 | 1.0 | 1.0 |
| CreditCard | 3477.0 | 0.295657 | 0.456403 | 0.0 | 0.0 | 0.0 | 1.0 | 1.0 |
After excluding outliers, the summary statistics of several variables changed considerably; for example, the maximum Income fell from 224 to 125. (Note that Experience still has a minimum of -3, an apparent data-entry error.)
# Show the retained rows and the outliers on separate heatmaps.
plt.figure(figsize = (16,9))
plt.subplot(2,2,1)
nn1 = credit1[~(Upper | Lower).any(axis=1)]
sns.heatmap(nn1, cmap = 'coolwarm')
plt.title("Heatmap without Outliers")
plt.subplot(2,2,2)
nn2 = credit1[(Upper | Lower).any(axis=1)]
sns.heatmap(nn2, cmap = 'coolwarm')
print('Credit Shape without Outliers:',nn1.shape)
print('Credit Shape with Outliers:',nn2.shape)
plt.title("Heatmap with Outliers")
Credit Shape without Outliers: (3477, 12) Credit Shape with Outliers: (1523, 12)
Text(0.5, 1.0, 'Heatmap without Outliers')
The heatmaps contrast the retained observations with the rows excluded as outliers.
I explored the data using visualization libraries such as seaborn, matplotlib, plotly, cufflinks, and plotly express. This section gives a thorough tour of the features in the dataset.
# To visualize the significant variables, use the pair plot()function.
plt.style.use('ggplot')
plt.figure(figsize=(20,15))
sns.pairplot(credit2, hue = 'Education', palette = 'viridis',
vars = ('Age','Experience','Income','CCAvg',
'Personal Loan','CreditCard',
'Securities Account',
'CD Account','Online'))
plt.show()
plt.tight_layout()
<Figure size 1440x1080 with 0 Axes>
<Figure size 432x288 with 0 Axes>
The pair plot gives a compact overview of the significant variables and shows how they relate to one another. The color of the data points indicates the education level.
# Create a plot of a correlation matrix.
plt.figure(figsize = (10,7))
# Create a variable called cc to calculate correlation.
cc = credit2.corr()
# To visualize the correlation matrix, use a heatmap.
cc1 = sns.heatmap(cc,cmap = 'viridis',
linecolor = 'black', linewidths = 1,
annot = True).axes.set_title("Correlation Matrix Plot")
cc1
plt.show()
The correlation matrix plot shows whether there is a positive or negative relationship between each pair of variables. It is a widely used statistical tool for summarizing pairwise relationships at a glance.
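The behavior of `DataFrame.corr()` can be illustrated on toy data: perfectly linear relationships produce coefficients of +1 or -1, and the diagonal of a correlation matrix is always 1.

```python
import pandas as pd

# Toy data: y rises linearly with x, z falls linearly with x.
toy = pd.DataFrame({
    "x": [1, 2, 3, 4, 5],
    "y": [2, 4, 6, 8, 10],
    "z": [10, 8, 6, 4, 2],
})

# Pearson correlation by default; one coefficient per pair of columns.
corr = toy.corr()
```

In the real matrix above, the strong positive Age–Experience correlation is the same phenomenon on a smaller scale.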
# Using the seaborn library, visualize a hierarchical-clustermap.
sns.clustermap(credit2, cmap = "BrBG_r",figsize=(10, 9),
row_cluster=False,
dendrogram_ratio=(.1, .2),
cbar_pos=(0, .2, .03, .4))
<seaborn.matrix.ClusterGrid at 0x127e7a8e0>
The dendrograms on the top of the x-axis are shown in a hierarchical-clustering map. They demonstrate a similarity among the variables. A color bar scale is used to indicate the values of each variable.
# Show a violin plot of education versus average credit card spending,
# with data points segregated by personal loan.
plt.style.use('ggplot')
plt.figure(figsize = (16,9))
sns.violinplot(x = 'Education', y = 'CCAvg' ,
data = credit2, hue = 'Personal Loan',
split = True)
# Show a legend on the right-hand side of the plot.
plt.legend(loc = "center left", bbox_to_anchor = (1,0.5),
title = 'Personal Loan')
plt.title("Violin Plot of education versus average spending on credit cards per month")
plt.ylabel("Average spending on credit cards per month")
Text(0, 0.5, 'Average spending on credit cards per month')
The violin plot shows the kernel density estimate of the underlying distribution, conveying more information than a box plot. It also shows the median of the customers' average monthly credit card spending.
# Using the plotly express library, create an interactive box plot.
plt.style.use('ggplot')
plt.figure(figsize = (14,7))
# Show an Interactive box plot.
fig = px.box(credit2, x = 'Family', y = 'Income', points = 'all',
color = "Personal Loan", title = "Box Plot")
fig.update_traces(quartilemethod="inclusive")
fig.show()
<Figure size 1008x504 with 0 Axes>
In this box plot created with the plotly express library, the data points are displayed alongside the box-and-whisker plot and colored by personal loan type, with boxes grouped by family size. Among customers who accepted a personal loan, those from two-member families had the highest median income.
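The median comparison read off the box plot can also be computed directly with `groupby`. A minimal sketch on hypothetical values (not the actual credit2 data):

```python
import pandas as pd

# Hypothetical rows mimicking the plot's inputs (not the real credit2 values).
toy = pd.DataFrame({
    "Family":        [1, 1, 2, 2, 2, 3],
    "Income":        [40, 60, 80, 100, 120, 70],
    "Personal Loan": [0, 0, 1, 1, 1, 0],
})

# Median income per family size, restricted to customers who accepted the loan.
medians = toy[toy["Personal Loan"] == 1].groupby("Family")["Income"].median()
```

Running the same pattern on `credit2` would verify the median comparison numerically rather than visually.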
# Filter an income variable and create a distinct income distribution plot.
plt.style.use('ggplot')
plt.figure(figsize=(14,7))
# Construct distribution plots.
credit2[credit2['Personal Loan'] == 1]['Income'].hist(bins = 30, alpha = 0.5,
color = 'blue', label = 'Personal Loan = 1')
credit2[credit2['Personal Loan'] == 0]['Income'].hist(bins = 30, alpha = 0.5,
color = 'red', label = 'Personal Loan = 0')
plt.legend()
plt.xlabel('Income')
plt.title( 'Distribution Plot of Income')
Text(0.5, 1.0, 'Distribution Plot of Income')
After filtering by the type of personal loan, the distribution plot shows that the group of customers who accepted the loan is much smaller than the group who did not.
# Joint plot/ Marginal histogram
plt.style.use('ggplot')
sns.jointplot(x = 'Income',y = 'CCAvg' , data = credit2,
kind = 'reg',scatter_kws={'alpha':0.15}, line_kws={'color': 'red'},
height = 8, ratio = 10, dropna= True)
# Show a title of the marginal histogram.
plt.suptitle("Marginal Histogram of Income versus CCAvg", y =1)
Text(0.5, 1, 'Marginal Histogram of Income versus CCAvg')
The marginal histogram shows the underlying distributions and kernel density estimates along each axis, together with a scatter plot exhibiting a positive linear relationship: higher income corresponds to higher monthly credit card spending.
# Using faceting, the ggplot grid style, visualize the kernel density estimate plot.
plt.style.use('ggplot')
# Facet grid
f = sns.FacetGrid(data = credit2, col = "Online" , row = "CreditCard")
f.map(sns.kdeplot, "Income")
plt.show()
A facet grid is another way to glance at the data. I utilized the facet grid to display more information about customers who owned a credit card, used internet banking, and had a sufficient income. The customer's income is represented on the x-axis. On the other hand, the kernel density estimation (KDE) presents a smooth density plot where all the specified conditions were satisfied.
# Visualize a count plot of education that is colored by the type of personal loan.
plt.figure(figsize = (12,6))
plt.subplot(2,2,1)
sns.countplot(x ='Education' ,hue = 'Personal Loan', data = credit2)
plt.title("Count Plot of Education", loc = "center")
plt.tight_layout()
# Visualize a count plot of family that is colored by whether the customer holds a credit card from the bank.
plt.subplot(2,2,2)
sns.countplot(x ='Family' ,hue = 'CreditCard', data = credit2)
plt.title("Count Plot of Family", loc = "center")
plt.tight_layout()
# Visualize a count plot of family that is colored by online banking usage.
plt.subplot(2,2,3)
sns.countplot(x ='Family' ,hue = 'Online', data = credit2)
plt.title("Count Plot of Family", loc = "center")
plt.tight_layout()
# Visualize a count plot of education that is colored by the securities account.
plt.subplot(2,2,4)
sns.countplot(x ='Education' ,hue = 'Securities Account', data = credit2)
plt.title("Count Plot of Education", loc = "center")
plt.tight_layout()
Count plots are a simple but powerful tool for exploring categorical data. They made the differences across several variables easy to see, are readable even by non-statisticians, and gave me a quick overview of the key variables in the dataset.
The following supervised machine learning algorithms are used in this project: Logistic Regression, K-nearest neighbors (KNN), decision tree, random forest, support vector machine (SVM), and artificial neural network (ANN).
# Define the variables X and y.
X = credit2.drop('Personal Loan', axis = 1)
y = credit2['Personal Loan']
# Import a library called train_test_split from scikit learn.
from sklearn.model_selection import train_test_split
# Split 30% of the data to the test data set.
X_train, X_test, y_train, y_test = train_test_split(X, y,
test_size=0.3, random_state=101)
# Import LogisticRegression from scikit learn linear family.
from sklearn.linear_model import LogisticRegression
# Define an instance of LogisticRegression called logmodel.
logmodel = LogisticRegression()
# Fit a training data to the logistic model.
logmodel.fit(X_train,y_train )
LogisticRegression()
# Predict the logistic model. Give a name predictions to the variable.
predictions = logmodel.predict(X_test)
# Import an evaluation metrics from scikit learn metrics.
from sklearn.metrics import classification_report,confusion_matrix
# Show a report on evaluation metrics.
print(classification_report(y_test, predictions))
print('\n')
# Show a confusion matrix on a heatmap.
class_names=[0,1] # name of classes
fig, ax = plt.subplots()
tick_marks = np.arange(len(class_names))
plt.xticks(tick_marks, class_names)
plt.yticks(tick_marks, class_names)
cnf_matrix = confusion_matrix(y_test, predictions)
sns.heatmap(pd.DataFrame(cnf_matrix), annot=True,
cmap="viridis" ,fmt='g')
ax.xaxis.set_label_position("top")
ax.xaxis.set_ticks_position("top")
plt.title('Confusion matrix', y=1.2)
plt.ylabel('Actual label')
plt.xlabel('Predicted label')
plt.tight_layout()
precision recall f1-score support
0 0.98 1.00 0.99 1012
1 0.69 0.34 0.46 32
accuracy 0.98 1044
macro avg 0.83 0.67 0.72 1044
weighted avg 0.97 0.98 0.97 1044
The classification report and the confusion matrix demonstrated the logistic model's strong performance. The report includes four key evaluation metrics: precision, recall, f1-score, and accuracy, some of which indicate an impressive overall outcome. The confusion matrix shows the counts of true positives (TP), true negatives (TN), false positives (FP), and false negatives (FN). A false negative (FN, also known as a Type II error) is a customer who purchased a personal loan but was predicted not to; a false positive (FP, a Type I error) is a customer who did not purchase a loan but was predicted to. Together, FP and FN make up the misclassifications, and the confusion matrix gives a true representation of actual versus predicted values.
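For binary labels, scikit-learn's `confusion_matrix` can be unpacked directly into the four counts with `ravel()`, which returns them in (TN, FP, FN, TP) order. A toy example (illustrative labels, not the model's predictions):

```python
from sklearn.metrics import confusion_matrix

# Toy actual vs. predicted labels for a binary classifier.
y_true = [0, 0, 0, 0, 1, 1, 1, 0]
y_pred = [0, 0, 0, 1, 1, 0, 1, 0]

# ravel() flattens the 2x2 matrix into (TN, FP, FN, TP).
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

misclassified = fp + fn   # Type I + Type II errors
```

This is a convenient alternative to reading the counts off the heatmap by eye.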
# Show a binary classification plot with each form of personal loan colored in a different color.
plt.figure(figsize = (12,4))
plt.title("Binary classification plot")
xlog = credit2['CCAvg'].values
ylog = credit2['Personal Loan'].values
# Use sns.regplot to show the logistic regression curve.
sns.regplot(x = xlog, y = ylog, logistic=True, color='green')
plt.axhline(.5, color="maroon", label="cutoff", ls = '--')
plt.legend(loc = 'best')
plt.show()
The binary classification plot shows the data points clustered at 0 (did not purchase) and 1 (purchased). The x-axis represents the customers' average monthly credit card spending (CCAvg), and the fitted logistic curve shows how the purchase probability varies with it. The dashed line marks the 0.5 cutoff.
# Import metrics from scikit learn.
from sklearn import metrics
log_acc = metrics.accuracy_score(y_test, predictions)*100
# Print accuracy, precision, recall, and f1 score.
print("Accuracy:",round(metrics.accuracy_score(y_test, predictions)*100,3),"%.")
print("Precision:",round(metrics.precision_score(y_test, predictions)*100,3),"%.")
print("Recall:",round(metrics.recall_score(y_test, predictions)*100,3),"%.")
print("F1 Score:",round(metrics.f1_score(y_test, predictions)*100,3),"%.")
Accuracy: 97.51 %. Precision: 68.75 %. Recall: 34.375 %. F1 Score: 45.833 %.
# Manually computed evaluation metrics such as accuracy, precision, recall, and f1 score.
# Define variables TP,TN,FP, and FN.
TP = 11
TN = 1007
FP = 5
FN = 21
# Compute Accuracy and store to a variable called accuracy.
accuracy = (TP+TN)/(TP+TN+FP+FN)
print('The accuracy is,',round(accuracy,2)*100,"%.")
# Compute Precision and store to a variable called Precision.
Precision = (TP) / (TP+FP)
print('The precision is,',round(Precision,2)*100,"%.")
# Compute Recall and store to a variable called recall.
recall = (TP) / (TP+FN)
print('The recall is,',round(recall,2)*100,"%.")
# Compute F1 score and store to a variable called F1.
F1 = 2 * ((Precision * recall) / (Precision + recall))
print('The F1-score is,',round(F1,2)*100,"%.")
# Compute sensitivity and store to a variable called sensti.
sensti = (TP) / (TP + FN)
print('The sensitivity is,',round(sensti,2)*100,"%.")
# Compute specificity and store to a variable called specifi.
specifi = (TN) / (TN + FP)
print('The specificity is,',round(specifi,3)*100,"%.")
The accuracy is, 98.0 %. The precision is, 69.0 %. The recall is, 34.0 %. The F1-score is, 46.0 %. The sensitivity is, 34.0 %. The specificity is, 99.5 %.
Accuracy indicates how close the predicted values are to the true values, so the logistic model's 98% accuracy is quite impressive. I also manually computed the sensitivity and specificity: the low sensitivity indicates many false negatives, while the high specificity indicates few false positives.
# Plot Receiver Operating Characteristic (ROC) curve.
plt.figure(figsize= (10,7))
# Compute the predicted probability by using a predict_proba() function.
y_pred_prob = logmodel.predict_proba(X_test)[:, 1]
fpr, tpr, _ = metrics.roc_curve(y_test, y_pred_prob)
auc = metrics.roc_auc_score(y_test, y_pred_prob)
# Plot the ROC curve and line join the end points.
plt.plot(fpr,tpr,label="Logistic, auc = {}".format(round(auc,2)),
color = "blue",ls ='-',
marker = 'o',lw = 2)
plt.plot([0,1],[0,1],lw = 2, color = 'blue')
# Define the legend and update the x and y labels of the ROC curve.
plt.legend(loc=4, bbox_to_anchor = (1,0))
plt.xlabel("False Positive rate")
plt.ylabel("True Positive rate")
plt.title('Receiver Operating Characteristic (ROC) curve')
plt.show()
The Receiver Operating Characteristic (ROC) curve is a plot of the true positive rate against the false positive rate. The area under the curve (AUC) indicates the model's discriminatory ability: the higher the AUC, the better the model.
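A minimal sketch of computing the ROC curve and AUC with scikit-learn on toy scores (illustrative values, not the logistic model's outputs):

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Toy true labels and predicted probabilities for class 1.
y_true = np.array([0, 0, 1, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8])

# roc_curve sweeps the decision threshold and returns the rates at each step.
fpr, tpr, thresholds = roc_curve(y_true, y_score)
auc = roc_auc_score(y_true, y_score)   # 0.75 for these toy values
```

A model that ranks every positive above every negative reaches an AUC of 1.0, while random scoring hovers around 0.5 (the diagonal reference line in the plot).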
# Check an error rate.
log_err = round(np.mean(y_test != predictions)*100,3)
print("The error rate is",log_err,"%.")
The error rate is 2.49 %.
The error rate measures how the predicted values compare with the actual values; simply put, it is the proportion of predictions that were wrong. The logistic model's error rate is under 5%, which is a good sign for the generalization of the outcomes.
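The error rate is just the fraction of misclassified predictions, the complement of accuracy. A toy sketch:

```python
import numpy as np

# Toy actual vs. predicted labels (illustrative values).
y_true = np.array([0, 1, 0, 1, 1, 0, 0, 1, 0, 0])
y_pred = np.array([0, 1, 0, 0, 1, 0, 1, 1, 0, 0])

error_rate = np.mean(y_true != y_pred)   # fraction misclassified: 0.2
accuracy = 1 - error_rate                # the two always sum to 1: 0.8
```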
# Import standard scaler from scikit learn.
from sklearn.preprocessing import StandardScaler
# Create the scaler variable. Scaler is a standard scaler instance.
scaler = StandardScaler()
# Standardize the data.
scaler.fit(credit2.drop('Personal Loan', axis = 1))
StandardScaler()
# Use a scaler.transform() function to transform the data to a standard scale.
scaled_feature = scaler.transform(credit2.drop('Personal Loan', axis = 1))
col = credit2.drop('Personal Loan', axis = 1)
# Create a variable col to look at the standardize columns.
col = col.columns
# Create a standardize version dataframe.
df = pd.DataFrame(scaled_feature, columns = col)
df.head()
| Age | Experience | Income | Family | CCAvg | Education | Mortgage | Securities Account | CD Account | Online | CreditCard | |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | -1.790757 | -1.675166 | -0.241788 | 1.310318 | 0.098148 | -1.143469 | -0.425336 | 2.897809 | -0.200209 | -1.2224 | -0.647891 |
| 1 | -0.044449 | -0.106873 | -0.742920 | 0.450862 | 0.012861 | -1.143469 | -0.425336 | 2.897809 | -0.200209 | -1.2224 | -0.647891 |
| 2 | -0.568341 | -0.455383 | -1.511322 | -1.268049 | -0.413573 | -1.143469 | -0.425336 | -0.345088 | -0.200209 | -1.2224 | -0.647891 |
| 3 | -0.917603 | -0.978147 | 1.462059 | -1.268049 | 1.036303 | 0.062794 | -0.425336 | -0.345088 | -0.200209 | -1.2224 | -0.647891 |
| 4 | -0.917603 | -1.065275 | -0.375424 | 1.310318 | -0.413573 | 0.062794 | -0.425336 | -0.345088 | -0.200209 | -1.2224 | 1.543469 |
# Define X and y variables.
X = df
y = credit2['Personal Loan']
# Split 30% of the standardized data to the test set. Set a random state equal to 101.
X_train1, X_test1, y_train1, y_test1 = train_test_split(X, y,
test_size=0.3, random_state=101)
# Import KNeighborsClassifier from scikit learn neighbors family.
from sklearn.neighbors import KNeighborsClassifier
# Pass an argument number of neighbors to KNeighborsClassifier, give a name to the variable knn.
knn = KNeighborsClassifier(n_neighbors = 1)
# Fit a knn model.
knn.fit(X_train1, y_train1)
KNeighborsClassifier(n_neighbors=1)
# Define a variable pred which stores knn predictions.
pred = knn.predict(X_test1)
# Show a report on performance evaluation metrics.
# Call a function called classification_report and show the evaluation metrics.
print(classification_report(y_test1, pred))
print('\n')
# Show a confusion matrix on a heatmap.
cnf_matrix = confusion_matrix(y_test1, pred)
class_names=[0,1] # name of classes
fig, ax = plt.subplots()
tick_marks = np.arange(len(class_names))
plt.xticks(tick_marks, class_names)
plt.yticks(tick_marks, class_names)
# Use a seaborn library and display the confusion matrix on the heatmap.
sns.heatmap(pd.DataFrame(cnf_matrix), annot=True,
cmap="plasma" ,fmt='g')
# Update x-axis and y-axis labels and positions.
ax.xaxis.set_label_position("top")
ax.xaxis.set_ticks_position("top")
plt.title('Confusion matrix', y=1.2)
plt.ylabel('Actual label')
plt.xlabel('Predicted label')
plt.tight_layout()
precision recall f1-score support
0 0.98 0.99 0.99 1012
1 0.68 0.47 0.56 32
accuracy 0.98 1044
macro avg 0.83 0.73 0.77 1044
weighted avg 0.97 0.98 0.97 1044
The classification report showed good performance with k = 1. The knn model's accuracy is high, and recall and f1-score improved noticeably compared to the logistic model, while precision was essentially unchanged. The confusion matrix shows only 24 misclassified values in total, so the result is better than the logistic model's.
# Apply the elbow method to choose an optimal k value.
error_rate = [] # empty array list
# Check every k value from 1 to 49.
for i in range(1,50):
knn = KNeighborsClassifier(n_neighbors = i)
knn.fit(X_train1, y_train1)
predic = knn.predict(X_test1)
error_rate.append(np.mean(predic != y_test1))
# Plot an error rate.
plt.figure(figsize= (14,6))
plt.plot(range(1,50), error_rate, color = 'blue', ls = '--', marker = 'o', markerfacecolor = 'red',markersize = 13)
plt.title("Error rate vs k-value")
plt.xlabel("k value")
plt.ylabel('error rate')
Text(0, 0.5, 'error rate')
The elbow plot shows that the error rate generally increases as the k value grows, so a small k is preferable here; k = 3 gives one of the lowest error rates and was chosen for the next run.
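The optimal k can also be picked programmatically from the error-rate list rather than read off the plot. A minimal sketch with hypothetical (illustrative) error rates:

```python
import numpy as np

# Hypothetical error rates for k = 1..6 (illustrative values only).
error_rate = [0.030, 0.025, 0.022, 0.024, 0.027, 0.031]

# The list is zero-indexed while k starts at 1, so shift the argmin up by one.
best_k = int(np.argmin(error_rate)) + 1
```

Applied to the real `error_rate` list computed above, the same one-liner would select the elbow automatically.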
# Use the optimal number of neighbors, k = 3, and repeat the process.
knn = KNeighborsClassifier (n_neighbors = 3)
knn.fit(X_train1, y_train1)
p = knn.predict(X_test1)
knn_acc = metrics.accuracy_score(y_test1, p)*100
print("Accuracy:",round(metrics.accuracy_score(y_test1, p)*100,2),".")
print("Precision:",round(metrics.precision_score(y_test1, p)*100,2),".")
print("Recall:",round(metrics.recall_score(y_test1, p)*100,2),".")
print("F1 Score:",round(metrics.f1_score(y_test1, p)*100,2),".")
# Show a report on performance evaluation metrics.
# Call a function called classification_report and show the evaluation metrics.
print(classification_report(y_test1, p))
print('\n')
# Show a confusion matrix on a heatmap.
cnf_matrix = confusion_matrix(y_test1, p)
class_names=[0,1] # name of classes
fig, ax = plt.subplots()
tick_marks = np.arange(len(class_names))
plt.xticks(tick_marks, class_names)
plt.yticks(tick_marks, class_names)
# Use a seaborn library and display the confusion matrix on the heatmap.
sns.heatmap(pd.DataFrame(cnf_matrix), annot=True,
cmap="magma" ,fmt='g')
# Update x-axis and y-axis labels and positions.
ax.xaxis.set_label_position("top")
ax.xaxis.set_ticks_position("top")
plt.title('Confusion matrix', y=1.2)
plt.ylabel('Actual label')
plt.xlabel('Predicted label')
plt.tight_layout()
Accuracy: 97.8 .
Precision: 76.47 .
Recall: 40.62 .
F1 Score: 53.06 .
precision recall f1-score support
0 0.98 1.00 0.99 1012
1 0.76 0.41 0.53 32
accuracy 0.98 1044
macro avg 0.87 0.70 0.76 1044
weighted avg 0.97 0.98 0.97 1044
The misclassified values were reduced to 23 with an optimal k-neighbor value. The true positive and true negative numbers have also increased slightly.
# Plot a scatter plot, apply knn algorithm.
fig = px.scatter(
X_test1, x = 'Income', y = 'CCAvg',
color = p, color_continuous_scale='sunset',
symbol=y_test1, symbol_map={'0': '*', '1': 'o'},
labels={'symbol': 'Personal Loan', 'color': '<b>Score of class</b>'},
title = "<b>Scatter Plot of Income versus CCAvg using a KNN Algorithm</b>"
)
fig.update_traces(marker_size=12, marker_line_width=1.5)
fig.update_layout(legend_orientation='h')
fig.show()
The KNN algorithm classified the personal loan outcomes 0 and 1. The number 0 indicates customers who didn't purchase the personal loan, whereas 1 indicates customers who did. The scatter plot depicts the personal loan type together with income and the customers' average monthly credit card usage.
# Find out the true positive and true negative counts in the target variable.
credit2['Personal Loan'].value_counts()
0 3366 1 111 Name: Personal Loan, dtype: int64
As can be seen, the target variable is imbalanced (3366 negatives versus 111 positives), which may impact the model's performance.
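Two common mitigations for class imbalance, neither of which is used in this notebook, are stratified splitting (which preserves the class proportions in both sets) and class re-weighting. A minimal sketch on synthetic data with a similar ~3% positive rate:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic features and imbalanced labels with a ~3% positive rate.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))
y = np.array([1] * 30 + [0] * 970)

# stratify=y keeps the positive rate identical in the train and test splits.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=101, stratify=y)

# class_weight='balanced' upweights the rare positive class during fitting.
model = LogisticRegression(class_weight='balanced').fit(X_tr, y_tr)
```

With heavy imbalance, accuracy alone is misleading (always predicting 0 scores ~97% here), which is why recall and f1-score deserve the attention they get in the reports above.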
# Compute an error rate where k = 1
print("The error rate with k-neighbors = 1 is",round(np.mean(y_test1 != pred)*100,3),"%.")
# Compute an error rate where k = 3
knn_err = round(np.mean(y_test1 != p)*100,3)
print("The error rate with k-neighbors = 3 is", knn_err,"%.")
The error rate with k-neighbors = 1 is 2.299 %. The error rate with k-neighbors = 3 is 2.203 %.
The error rate improved only marginally (2.299 % to 2.203 %), mirroring the small change in accuracy; the other evaluation metrics and the misclassified counts changed more noticeably.
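The choice of k can be made systematic with an elbow-style search over the test error rate. A hedged sketch on synthetic data (the notebook's own features are not reproduced here):

```python
# Hedged sketch: elbow-style search for the k with the lowest test error.
# Synthetic data stands in for the notebook's features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=600, weights=[0.9], random_state=101)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=101)

error_rates = []
for k in range(1, 21):
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    error_rates.append(np.mean(knn.predict(X_te) != y_te))

best_k = int(np.argmin(error_rates)) + 1  # +1 because k starts at 1
print(best_k, round(error_rates[best_k - 1] * 100, 3))
```

Plotting error_rates against k gives the familiar elbow curve; the minimum (or the start of the flat region) is a reasonable k.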
# Define X and y variables.
X = credit2.drop('Personal Loan', axis = 1)
y = credit2['Personal Loan']
# Call train_test_split() function to split 30% data to the test set randomly.
X_train2, X_test2, y_train2, y_test2 = train_test_split(X, y,
test_size=0.3, random_state=101)
# Import DecisionTreeClassifier() function from scikit learn tree family.
from sklearn.tree import DecisionTreeClassifier
# Create an instance dtc of DecisionTreeClassifier().
dtc = DecisionTreeClassifier()
# Fit a decision tree model.
dtc.fit(X_train2, y_train2)
DecisionTreeClassifier()
# Predict a decision tree model.
dtc_pred = dtc.predict(X_test2)
# Showcase the performance evaluation metrics.
print(classification_report(y_test2, dtc_pred))
print('\n')
# Using the sns.heatmap() function, create a Confusion Matrix on the heatmap.
cnf_matrix = confusion_matrix(y_test2, dtc_pred)
class_names=[0,1] # name of classes
fig, ax = plt.subplots()
tick_marks = np.arange(len(class_names))
plt.xticks(tick_marks, class_names)
plt.yticks(tick_marks, class_names)
sns.heatmap(pd.DataFrame(cnf_matrix), annot=True,
cmap="rainbow" ,fmt='g')
ax.xaxis.set_label_position("top")
ax.xaxis.set_ticks_position("top")
plt.title('Confusion matrix', y=1.2)
plt.ylabel('Actual label')
plt.xlabel('Predicted label')
plt.tight_layout()
precision recall f1-score support
0 0.99 0.99 0.99 1012
1 0.62 0.66 0.64 32
accuracy 0.98 1044
macro avg 0.80 0.82 0.81 1044
weighted avg 0.98 0.98 0.98 1044
The decision tree model reduced the misclassified count to 21, compared with the k-nearest neighbors algorithm. Other important evaluation metrics, such as recall and the F1 score, improved significantly; precision, on the other hand, fell (from 0.76 to 0.62). The accuracy was essentially unchanged.
# Print accuracy, precision, recall, and f1 score. Round up by 2 decimal places.
decision_acc = metrics.accuracy_score(y_test2, dtc_pred)*100
print("Accuracy:",round(metrics.accuracy_score(y_test2, dtc_pred)*100,2),"%.")
print("Precision:",round(metrics.precision_score(y_test2, dtc_pred)*100,2),"%.")
print("Recall:",round(metrics.recall_score(y_test2, dtc_pred)*100,2),"%.")
print("F1 Score:",round(metrics.f1_score(y_test2, dtc_pred)*100,2),"%.")
Accuracy: 97.7 %.
Precision: 61.76 %.
Recall: 65.62 %.
F1 Score: 63.64 %.
# Calculate the decision tree model's error rate.
decision_err = round(np.mean(y_test2 != dtc_pred)*100,3)
print("The error rate is",decision_err,"%.")
The error rate is 2.299 %.
Compared to the logistic regression and KNN models, the error rate is no worse while recall and the F1 score are clearly better, so the decision tree is the most effective model thus far.
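One caveat: a default DecisionTreeClassifier, as used above, grows until every leaf is pure, which invites overfitting. A hedged sketch of limiting depth, on synthetic stand-in data:

```python
# Hedged sketch: capping tree depth to curb overfitting; the default
# DecisionTreeClassifier grows until every leaf is pure.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=800, random_state=101)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=101)

full = DecisionTreeClassifier(random_state=101).fit(X_tr, y_tr)
pruned = DecisionTreeClassifier(max_depth=4, random_state=101).fit(X_tr, y_tr)

print(full.get_depth(), pruned.get_depth())
```

max_depth (or ccp_alpha for cost-complexity pruning) trades a small amount of training fit for a simpler, more general tree.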
# Visualize decision tree model using a export_graphviz and pydotplus libraries.
plt.figure(figsize= (16,9))
features = list(credit2.columns[1:])
dot_data = StringIO()
export_graphviz(dtc, out_file = dot_data,
feature_names=features,
filled=True,rounded=False,
special_characters=True,
class_names=['0','1'])
graph = pydotplus.graph_from_dot_data(dot_data.getvalue())
Image(graph.create_png())
The decision tree is a very useful tool for decision making. Its root node at the top splits on the Family feature; below it are decision nodes and terminal nodes, also known as leaves. The tree classifies the personal loan data with rules that are easy to follow, even for readers who are not statisticians.
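When graphviz/pydotplus are unavailable, scikit-learn's export_text renders the same rules as plain text. A small sketch on the built-in iris data (not the loan data):

```python
# Hedged sketch: export_text prints the fitted tree's rules without any
# graphviz dependency. Iris stands in for the loan data here.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
clf = DecisionTreeClassifier(max_depth=2, random_state=0)
clf.fit(iris.data, iris.target)
rules = export_text(clf, feature_names=list(iris.feature_names))
print(rules)
```

The text output lists one indented line per split and one per leaf, which is handy for pasting rules into a report.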
# Define X and y variables.
X = credit2.drop('Personal Loan', axis = 1)
y = credit2['Personal Loan']
# Call train_test_split() function to split 30% data to the test set randomly.
X_train3, X_test3, y_train3, y_test3 = train_test_split(X, y,
test_size=0.3, random_state=101)
# Import RandomForestClassifier() function from scikit learn library.
from sklearn.ensemble import RandomForestClassifier
# Create an instance rfc of RandomForestClassifier().
rfc = RandomForestClassifier(n_estimators = 500, random_state= 50)
# Fit a training data to a random forest model.
rfc.fit(X_train3, y_train3)
RandomForestClassifier(n_estimators=500, random_state=50)
# Predict a random forest model.
rfc_pred = rfc.predict(X_test3)
# Show the performance evaluation metrics.
print(classification_report(y_test3, rfc_pred))
print('\n')
# Use a Confusion Matrix() function to display a confusion matrix on the heatmap.
cnf_matrix = confusion_matrix(y_test3, rfc_pred)
class_names=[0,1] # name of classes
fig, ax = plt.subplots()
tick_marks = np.arange(len(class_names))
plt.xticks(tick_marks, class_names)
plt.yticks(tick_marks, class_names)
sns.heatmap(pd.DataFrame(cnf_matrix), annot=True,
cmap="winter" ,fmt='g')
ax.xaxis.set_label_position("top")
ax.xaxis.set_ticks_position("top")
plt.title('Confusion matrix', y=1.2)
plt.ylabel('Actual label')
plt.xlabel('Predicted label')
plt.tight_layout()
precision recall f1-score support
0 0.99 1.00 0.99 1012
1 0.83 0.59 0.69 32
accuracy 0.98 1044
macro avg 0.91 0.79 0.84 1044
weighted avg 0.98 0.98 0.98 1044
The random forest model outperforms the decision tree: precision, the F1 score, and the misclassified count all improve somewhat, and accuracy also rises marginally.
# Show an accuracy, precision, recall, and f1 score.
random_acc = metrics.accuracy_score(y_test3, rfc_pred)*100
print("Accuracy:",round(metrics.accuracy_score(y_test3, rfc_pred)*100,3),"%.")
print("Precision:",round(metrics.precision_score(y_test3, rfc_pred)*100,3),"%.")
print("Recall:",round(metrics.recall_score(y_test3, rfc_pred)*100,3),"%.")
print("F1 Score:",round(metrics.f1_score(y_test3, rfc_pred)*100,3),"%.")
Accuracy: 98.372 %.
Precision: 82.609 %.
Recall: 59.375 %.
F1 Score: 69.091 %.
# Compute an error_rate of the random forest model.
random_err = round(np.mean(y_test3 != rfc_pred)*100,3)
print("The error rate is",random_err,"%.")
The error rate is 1.628 %.
The error rate falls from the decision tree's 2.299 % to 1.628 %. As a result, the random forest model has truly excelled.
# Visualize a single decision tree in random forest.
features = list(credit2.columns[1:])
fig, axes = plt.subplots(nrows = 1,ncols = 1,figsize = (4,4), dpi=1000)
tree.plot_tree(rfc.estimators_[0],
feature_names = features,
class_names=['0','1'],
filled = True,
rounded = True,
precision = 1);
plt.title('Single Decision Tree in Random Forest')
Text(0.5, 1.0, 'Single Decision Tree in Random Forest')
Random forest builds many decision trees by sampling observations (rows) and candidate features (columns) at random, then aggregates their predictions. This single tree has a root node that splits on Family, followed by decision nodes and terminal nodes. It is far more readable and interpretable than one large, complex decision tree.
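A fitted RandomForestClassifier also exposes feature_importances_, which summarizes the averaging described above. A hedged sketch on synthetic stand-in data (the feature indices are illustrative, not the notebook's columns):

```python
# Hedged sketch: ranking features by the forest's impurity-based
# importances. Synthetic stand-in data; indices are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=6, n_informative=3,
                           random_state=50)
rfc = RandomForestClassifier(n_estimators=100, random_state=50).fit(X, y)

# Importances sum to 1; sort descending to see the strongest predictors.
order = np.argsort(rfc.feature_importances_)[::-1]
for i in order[:3]:
    print(f"feature_{i}: {rfc.feature_importances_[i]:.3f}")
```

On the loan data, this kind of ranking would show which customer attributes drive the purchase prediction.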
# Import SVC() function from scikit learn library.
from sklearn.svm import SVC
# Define X and y variables.
X = credit2.drop('Personal Loan', axis = 1)
y = credit2['Personal Loan']
# Call train_test_split() function to split 30% data to the test set randomly.
X_train4, X_test4, y_train4, y_test4 = train_test_split(X, y,
test_size=0.3, random_state=101)
# Create an instance of support vector classifier (SVC).
svm = SVC()
# Fit a training data to the svm model.
svm.fit(X_train4, y_train4)
SVC()
# Use svm.predict() function to predict the model.
svm_pred = svm.predict(X_test4)
# Show the peformance evaluation metrics of a svm model.
print(classification_report(y_test4, svm_pred))
print('\n')
# Display a Confusion Matrix on the heatmap.
cnf_matrix = confusion_matrix(y_test4, svm_pred)
class_names=[0,1] # name of classes
fig, ax = plt.subplots()
tick_marks = np.arange(len(class_names))
plt.xticks(tick_marks, class_names)
plt.yticks(tick_marks, class_names)
# Use a sns.heatmap() function to visualize the heatmap.
sns.heatmap(pd.DataFrame(cnf_matrix), annot=True,
cmap="seismic" ,fmt='g')
ax.xaxis.set_label_position("top")
ax.xaxis.set_ticks_position("top")
plt.title('Confusion matrix', y=1.2)
plt.ylabel('Actual label')
plt.xlabel('Predicted label')
plt.tight_layout()
precision recall f1-score support
0 0.97 1.00 0.98 1012
1 0.00 0.00 0.00 32
accuracy 0.97 1044
macro avg 0.48 0.50 0.49 1044
weighted avg 0.94 0.97 0.95 1044
As can be seen, the SVM predicted only one class of personal loan (it never predicted class 1). We must first choose a better C (the regularization parameter) and gamma so the model accounts for both classes.
# Import grid search cross-validation from the scikit-learn library.
from sklearn.model_selection import GridSearchCV
# Use grid search cross-validation to find the best C and gamma tuning parameters.
param_grid1 = {'C':[0.1,1,10,100,1000], 'gamma':[1, 0.1, 0.01, 0.001, 0.0001]}
grid = GridSearchCV(SVC(), param_grid = param_grid1, verbose = 3)
# The higher the verbose number, the more detailed the progress messages.
# Fit a model.
grid.fit(X_train4, y_train4)
Fitting 5 folds for each of 25 candidates, totalling 125 fits. (Per-fold progress log omitted; cross-validation scores ranged from roughly 0.955 to 0.984, with the best scores at C=1000 combined with gamma=0.001 or gamma=0.0001.)
GridSearchCV(estimator=SVC(),
param_grid={'C': [0.1, 1, 10, 100, 1000],
'gamma': [1, 0.1, 0.01, 0.001, 0.0001]},
verbose=3)
# Show the best estimator, the best score, and the best parameter.
print('The best estimator is,',grid.best_estimator_)
print('The best score is,',grid.best_score_)
print('The best parameters are,',grid.best_params_)
The best estimator is, SVC(C=1000, gamma=0.001)
The best score is, 0.9769826180275645
The best parameters are, {'C': 1000, 'gamma': 0.001}
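For reference, with refit=True (the GridSearchCV default), grid.predict() delegates to grid.best_estimator_, an SVC already refit on the whole training set with the winning parameters. A small sketch on synthetic data:

```python
# Hedged sketch: with refit=True (the default), grid.predict() is the same
# as calling grid.best_estimator_.predict(). Synthetic stand-in data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, random_state=101)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=101)

grid = GridSearchCV(SVC(), {'C': [1, 10], 'gamma': [0.01, 0.001]}, cv=3)
grid.fit(X_tr, y_tr)

same = np.array_equal(grid.predict(X_te), grid.best_estimator_.predict(X_te))
print(same)
```

This is why predicting with the grid object directly, as the notebook does next, already uses the tuned model.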
# Define a variable called svm_pred1, predict a model.
svm_pred1 = grid.predict(X_test4)
# Show the classification matrix and confusion matrix.
print(classification_report(y_test4, svm_pred1))
print('\n')
# Show the Confusion Matrix.
cnf_matrix = confusion_matrix(y_test4, svm_pred1)
class_names=[0,1] # name of classes
fig, ax = plt.subplots()
tick_marks = np.arange(len(class_names))
plt.xticks(tick_marks, class_names)
plt.yticks(tick_marks, class_names)
# Use the sns.heatmap() function to display the confusion matrix on the heatmap.
sns.heatmap(pd.DataFrame(cnf_matrix), annot=True,
cmap="YlGnBu_r" ,fmt='g')
ax.xaxis.set_label_position("top")
ax.xaxis.set_ticks_position("top")
plt.title('Confusion matrix', y=1.2)
plt.ylabel('Actual label')
plt.xlabel('Predicted label')
plt.tight_layout()
precision recall f1-score support
0 0.99 0.99 0.99 1012
1 0.70 0.59 0.64 32
accuracy 0.98 1044
macro avg 0.85 0.79 0.82 1044
weighted avg 0.98 0.98 0.98 1044
With the grid-search cross-validation parameters, the model now predicts both classes of personal loans. Even with the optimal C and gamma, however, it does not match the random forest's performance.
# Show an accuracy, precision, recall, and f1 score.
svm_acc = metrics.accuracy_score(y_test4, svm_pred1)*100
print("Accuracy:",round(metrics.accuracy_score(y_test4, svm_pred1)*100,3),"%.")
print("Precision:",round(metrics.precision_score(y_test4, svm_pred1)*100,3),"%.")
print("Recall:",round(metrics.recall_score(y_test4, svm_pred1)*100,3),"%.")
print("F1 Score:",round(metrics.f1_score(y_test4, svm_pred1)*100,3),"%.")
Accuracy: 97.989 %.
Precision: 70.37 %.
Recall: 59.375 %.
F1 Score: 64.407 %.
# Compute an error_rate of the support vector machine (SVM) model.
svm_err = round(np.mean(y_test4 != svm_pred1)*100,3)
print("The error rate is",svm_err,"%.")
The error rate is 2.011 %.
# Visualize the support vector classifier (SVC) using the optimal tuning parameters.
# Overlay the support vectors on a scatter plot of the training data.
plt.figure(figsize = (10,7))
svm1 = SVC(C=1000, gamma= 0.001)
svm1 = svm1.fit(X_train4, y_train4)
support_vectors = svm1.support_vectors_
sns.scatterplot(x = X_train4.iloc[:,2], y = X_train4.iloc[:,1], alpha = 0.5,
                color = "blue")
# Use the same two feature columns for the support vectors so they align
# with the axes of the training-data scatter.
plt.scatter(support_vectors[:,2], support_vectors[:,1], color='red')
plt.title('Stacking support vectors on top of training data plot')
plt.xlabel('Annual Income (in $)')
plt.ylabel('Experience (in Years)')
plt.show()
We can display the support vectors and the training set by simply using the Matplotlib library to visualize the training data and stacking the support vectors on top.
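Since each model's accuracy and error rate were stored above, the printed results so far can be gathered into one frame for side-by-side comparison. The numbers below are copied from the notebook's own outputs, not recomputed:

```python
# Hedged sketch: the accuracies and error rates printed above, collected
# into one frame. Values are copied from the notebook's output.
import pandas as pd

summary = pd.DataFrame({
    'model': ['KNN (k=3)', 'Decision Tree', 'Random Forest', 'SVM (tuned)'],
    'accuracy_%': [97.8, 97.7, 98.372, 97.989],
    'error_%': [2.203, 2.299, 1.628, 2.011],
}).sort_values('error_%').reset_index(drop=True)

print(summary)
```

Sorting by error rate puts the random forest first, matching the narrative so far.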
Training the Artificial Neural Network (ANN) / deep learning neural network with Stochastic Gradient Descent.
# Define X and y variables
X = credit2.drop('Personal Loan', axis = 1).values
y = credit2['Personal Loan'].values
# Call train_test_split() function to split 30% data to the test set randomly.
X_train5, X_test5, y_train5, y_test5 = train_test_split(
X, y, test_size = 0.3, random_state=101)
# Import a MinMaxScaler from sckit learn library.
from sklearn.preprocessing import MinMaxScaler
# Create an instance scaler of MinMaxScaler().
scaler = MinMaxScaler()
# Fit the scaler on the training data, then apply the same learned
# transformation to the test data (avoids test-set leakage).
X_train5 = scaler.fit_transform(X_train5)
X_test5 = scaler.transform(X_test5)
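A note on scaling: MinMaxScaler should learn its min/max from the training split only, and the test split should be transformed with those same statistics; refitting on the test set leaks information. A minimal illustration:

```python
# Hedged sketch: fit MinMaxScaler on training data only, then reuse the
# learned min/max to transform the test data.
import numpy as np
from sklearn.preprocessing import MinMaxScaler

X_train = np.array([[0.0], [50.0], [100.0]])
X_test = np.array([[200.0]])

scaler = MinMaxScaler()
X_train_s = scaler.fit_transform(X_train)  # learns min=0, max=100
X_test_s = scaler.transform(X_test)        # reuses the training statistics

print(X_test_s)  # [[2.]] -- out-of-range test values can exceed [0, 1]
```

Refitting on the test set would instead rescale 200 to 1.0, silently changing the feature distribution the model was trained on.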
# Import Sequential from tensorflow and keras libraries.
from tensorflow.keras.models import Sequential
# Import Dense from tensorflow and keras libraries.
from tensorflow.keras.layers import Dense
# Create an artificial neural network model.
model = Sequential()
# Add an input and the first hidden layer to the model.
# Use rectified linear unit (relu) an activation function.
# Input dim tells us the number of nodes in the Input Layer.
model.add(Dense(units = 30,input_dim = 11, activation='relu'))# 30 neurons
# Add a second hidden layer to the model.
model.add(Dense(units = 15,activation='relu'))
# Add an output layer to the model.
model.add(Dense(units = 1 ,activation='sigmoid'))
# I used sigmoid because it is a binary classification problem.
# Compile the Artificial neural network (ANN) model.
model.compile(loss = 'binary_crossentropy', optimizer = 'adam', metrics= ['accuracy'])
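Before fitting for a fixed 600 epochs, one could register an EarlyStopping callback (not used in this notebook) so training halts once the validation loss stops improving. A hedged sketch with synthetic stand-in data:

```python
# Hedged sketch: EarlyStopping halts training once val_loss stops
# improving; restore_best_weights keeps the best epoch's weights.
# Synthetic data stands in for the scaled loan features.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.callbacks import EarlyStopping

rng = np.random.default_rng(0)
X = rng.random((500, 11))
y = (X[:, 0] > 0.5).astype(int)

model = Sequential([
    Input(shape=(11,)),
    Dense(30, activation='relu'),
    Dense(15, activation='relu'),
    Dense(1, activation='sigmoid'),
])
model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])

stop = EarlyStopping(monitor='val_loss', patience=20,
                     restore_best_weights=True)
history = model.fit(X, y, epochs=600, validation_split=0.3,
                    callbacks=[stop], verbose=0)

print(len(history.history['loss']))  # number of epochs actually run
```

With patience=20, training stops 20 epochs after the best validation loss, usually well before the 600-epoch cap.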
# Fit a training data to the Artificial neural network (ANN) model.
# 600 epochs means the network sees the full training set 600 times.
model.fit(x = X_train5, y = y_train5, epochs = 600,
validation_data = (X_test5, y_test5))
Train on 2433 samples, validate on 1044 samples Epoch 1/600
(A TensorFlow CPU-instruction notice and the per-epoch progress log are omitted here; over the roughly 100 epochs shown, the training loss fell from 0.3539 to about 0.037 and the validation accuracy settled around 0.97-0.98.)
val_loss: 0.0647 - val_acc: 0.9722 Epoch 103/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0359 - acc: 0.9864 - val_loss: 0.0614 - val_acc: 0.9741 Epoch 104/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0364 - acc: 0.9860 - val_loss: 0.0602 - val_acc: 0.9751 Epoch 105/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0361 - acc: 0.9868 - val_loss: 0.0625 - val_acc: 0.9751 Epoch 106/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0363 - acc: 0.9873 - val_loss: 0.0596 - val_acc: 0.9770 Epoch 107/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0366 - acc: 0.9856 - val_loss: 0.0652 - val_acc: 0.9713 Epoch 108/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0355 - acc: 0.9877 - val_loss: 0.0619 - val_acc: 0.9741 Epoch 109/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0364 - acc: 0.9864 - val_loss: 0.0687 - val_acc: 0.9713 Epoch 110/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0375 - acc: 0.9885 - val_loss: 0.0639 - val_acc: 0.9741 Epoch 111/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0370 - acc: 0.9873 - val_loss: 0.0653 - val_acc: 0.9722 Epoch 112/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0349 - acc: 0.9877 - val_loss: 0.0630 - val_acc: 0.9732 Epoch 113/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0363 - acc: 0.9864 - val_loss: 0.0621 - val_acc: 0.9732 Epoch 114/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0349 - acc: 0.9877 - val_loss: 0.0632 - val_acc: 0.9732 Epoch 115/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0347 - acc: 0.9864 - val_loss: 0.0644 - val_acc: 0.9722 Epoch 116/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0351 - acc: 0.9873 - val_loss: 0.0627 - val_acc: 0.9732 Epoch 117/600 2433/2433 
[==============================] - 0s 34us/step - loss: 0.0350 - acc: 0.9889 - val_loss: 0.0674 - val_acc: 0.9732 Epoch 118/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0361 - acc: 0.9864 - val_loss: 0.0609 - val_acc: 0.9770 Epoch 119/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0358 - acc: 0.9881 - val_loss: 0.0725 - val_acc: 0.9703 Epoch 120/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0343 - acc: 0.9877 - val_loss: 0.0600 - val_acc: 0.9780 Epoch 121/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0346 - acc: 0.9881 - val_loss: 0.0600 - val_acc: 0.9751 Epoch 122/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0345 - acc: 0.9885 - val_loss: 0.0654 - val_acc: 0.9713 Epoch 123/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0335 - acc: 0.9877 - val_loss: 0.0657 - val_acc: 0.9713 Epoch 124/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0349 - acc: 0.9889 - val_loss: 0.0633 - val_acc: 0.9732 Epoch 125/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0338 - acc: 0.9864 - val_loss: 0.0621 - val_acc: 0.9741 Epoch 126/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0340 - acc: 0.9877 - val_loss: 0.0609 - val_acc: 0.9741 Epoch 127/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0337 - acc: 0.9889 - val_loss: 0.0608 - val_acc: 0.9741 Epoch 128/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0339 - acc: 0.9856 - val_loss: 0.0649 - val_acc: 0.9713 Epoch 129/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0329 - acc: 0.9885 - val_loss: 0.0691 - val_acc: 0.9713 Epoch 130/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0335 - acc: 0.9877 - val_loss: 0.0630 - val_acc: 0.9741 Epoch 131/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0335 - 
acc: 0.9881 - val_loss: 0.0744 - val_acc: 0.9693 Epoch 132/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0333 - acc: 0.9877 - val_loss: 0.0700 - val_acc: 0.9713 Epoch 133/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0322 - acc: 0.9885 - val_loss: 0.0685 - val_acc: 0.9722 Epoch 134/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0335 - acc: 0.9873 - val_loss: 0.0620 - val_acc: 0.9751 Epoch 135/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0330 - acc: 0.9881 - val_loss: 0.0645 - val_acc: 0.9713 Epoch 136/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0316 - acc: 0.9893 - val_loss: 0.0708 - val_acc: 0.9713 Epoch 137/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0317 - acc: 0.9901 - val_loss: 0.0649 - val_acc: 0.9732 Epoch 138/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0327 - acc: 0.9877 - val_loss: 0.0618 - val_acc: 0.9761 Epoch 139/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0325 - acc: 0.9885 - val_loss: 0.0704 - val_acc: 0.9713 Epoch 140/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0328 - acc: 0.9889 - val_loss: 0.0727 - val_acc: 0.9703 Epoch 141/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0322 - acc: 0.9881 - val_loss: 0.0656 - val_acc: 0.9732 Epoch 142/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0345 - acc: 0.9868 - val_loss: 0.0700 - val_acc: 0.9713 Epoch 143/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0327 - acc: 0.9901 - val_loss: 0.0619 - val_acc: 0.9751 Epoch 144/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0306 - acc: 0.9905 - val_loss: 0.0706 - val_acc: 0.9703 Epoch 145/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0322 - acc: 0.9877 - val_loss: 0.0741 - val_acc: 0.9693 Epoch 146/600 
2433/2433 [==============================] - 0s 34us/step - loss: 0.0311 - acc: 0.9893 - val_loss: 0.0633 - val_acc: 0.9741 Epoch 147/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0322 - acc: 0.9885 - val_loss: 0.0657 - val_acc: 0.9732 Epoch 148/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0308 - acc: 0.9901 - val_loss: 0.0672 - val_acc: 0.9713 Epoch 149/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0302 - acc: 0.9901 - val_loss: 0.0619 - val_acc: 0.9780 Epoch 150/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0331 - acc: 0.9885 - val_loss: 0.0753 - val_acc: 0.9693 Epoch 151/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0315 - acc: 0.9881 - val_loss: 0.0630 - val_acc: 0.9741 Epoch 152/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0332 - acc: 0.9873 - val_loss: 0.0632 - val_acc: 0.9751 Epoch 153/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0316 - acc: 0.9893 - val_loss: 0.0672 - val_acc: 0.9732 Epoch 154/600 2433/2433 [==============================] - 0s 38us/step - loss: 0.0300 - acc: 0.9893 - val_loss: 0.0709 - val_acc: 0.9713 Epoch 155/600 2433/2433 [==============================] - 0s 36us/step - loss: 0.0387 - acc: 0.9864 - val_loss: 0.0727 - val_acc: 0.9703 Epoch 156/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0325 - acc: 0.9889 - val_loss: 0.0659 - val_acc: 0.9732 Epoch 157/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0329 - acc: 0.9877 - val_loss: 0.0661 - val_acc: 0.9751 Epoch 158/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0310 - acc: 0.9881 - val_loss: 0.0665 - val_acc: 0.9732 Epoch 159/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0296 - acc: 0.9901 - val_loss: 0.0704 - val_acc: 0.9713 Epoch 160/600 2433/2433 [==============================] - 0s 35us/step - loss: 
0.0294 - acc: 0.9897 - val_loss: 0.0841 - val_acc: 0.9693 Epoch 161/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0308 - acc: 0.9897 - val_loss: 0.0701 - val_acc: 0.9703 Epoch 162/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0300 - acc: 0.9905 - val_loss: 0.0812 - val_acc: 0.9684 Epoch 163/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0304 - acc: 0.9910 - val_loss: 0.0748 - val_acc: 0.9713 Epoch 164/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0301 - acc: 0.9914 - val_loss: 0.0702 - val_acc: 0.9722 Epoch 165/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0295 - acc: 0.9901 - val_loss: 0.0669 - val_acc: 0.9722 Epoch 166/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0296 - acc: 0.9910 - val_loss: 0.0693 - val_acc: 0.9732 Epoch 167/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0300 - acc: 0.9914 - val_loss: 0.0660 - val_acc: 0.9741 Epoch 168/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0290 - acc: 0.9897 - val_loss: 0.0690 - val_acc: 0.9732 Epoch 169/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0283 - acc: 0.9922 - val_loss: 0.0776 - val_acc: 0.9684 Epoch 170/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0284 - acc: 0.9901 - val_loss: 0.0762 - val_acc: 0.9693 Epoch 171/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0281 - acc: 0.9901 - val_loss: 0.0678 - val_acc: 0.9741 Epoch 172/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0283 - acc: 0.9901 - val_loss: 0.0712 - val_acc: 0.9722 Epoch 173/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0298 - acc: 0.9885 - val_loss: 0.0696 - val_acc: 0.9722 Epoch 174/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0283 - acc: 0.9897 - val_loss: 0.0738 - val_acc: 0.9732 Epoch 
175/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0287 - acc: 0.9914 - val_loss: 0.0746 - val_acc: 0.9713 Epoch 176/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0303 - acc: 0.9893 - val_loss: 0.0667 - val_acc: 0.9713 Epoch 177/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0285 - acc: 0.9901 - val_loss: 0.0691 - val_acc: 0.9732 Epoch 178/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0278 - acc: 0.9910 - val_loss: 0.0759 - val_acc: 0.9693 Epoch 179/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0296 - acc: 0.9889 - val_loss: 0.0784 - val_acc: 0.9684 Epoch 180/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0278 - acc: 0.9905 - val_loss: 0.0802 - val_acc: 0.9703 Epoch 181/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0281 - acc: 0.9901 - val_loss: 0.0698 - val_acc: 0.9732 Epoch 182/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0279 - acc: 0.9910 - val_loss: 0.0877 - val_acc: 0.9693 Epoch 183/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0287 - acc: 0.9889 - val_loss: 0.0669 - val_acc: 0.9722 Epoch 184/600 2433/2433 [==============================] - 0s 36us/step - loss: 0.0280 - acc: 0.9910 - val_loss: 0.0708 - val_acc: 0.9693 Epoch 185/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0273 - acc: 0.9918 - val_loss: 0.0793 - val_acc: 0.9703 Epoch 186/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0288 - acc: 0.9897 - val_loss: 0.0680 - val_acc: 0.9732 Epoch 187/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0277 - acc: 0.9910 - val_loss: 0.0794 - val_acc: 0.9693 Epoch 188/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0374 - acc: 0.9852 - val_loss: 0.0806 - val_acc: 0.9703 Epoch 189/600 2433/2433 [==============================] - 0s 35us/step - 
loss: 0.0291 - acc: 0.9897 - val_loss: 0.0806 - val_acc: 0.9703 Epoch 190/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0279 - acc: 0.9905 - val_loss: 0.0726 - val_acc: 0.9693 Epoch 191/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0265 - acc: 0.9910 - val_loss: 0.0704 - val_acc: 0.9713 Epoch 192/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0270 - acc: 0.9901 - val_loss: 0.0684 - val_acc: 0.9722 Epoch 193/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0274 - acc: 0.9901 - val_loss: 0.0747 - val_acc: 0.9713 Epoch 194/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0276 - acc: 0.9910 - val_loss: 0.0686 - val_acc: 0.9703 Epoch 195/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0257 - acc: 0.9918 - val_loss: 0.0747 - val_acc: 0.9713 Epoch 196/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0274 - acc: 0.9897 - val_loss: 0.0765 - val_acc: 0.9722 Epoch 197/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0265 - acc: 0.9922 - val_loss: 0.0687 - val_acc: 0.9732 Epoch 198/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0261 - acc: 0.9910 - val_loss: 0.0746 - val_acc: 0.9722 Epoch 199/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0275 - acc: 0.9910 - val_loss: 0.0733 - val_acc: 0.9713 Epoch 200/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0266 - acc: 0.9905 - val_loss: 0.0705 - val_acc: 0.9713 Epoch 201/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0263 - acc: 0.9914 - val_loss: 0.0723 - val_acc: 0.9722 Epoch 202/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0257 - acc: 0.9922 - val_loss: 0.0750 - val_acc: 0.9722 Epoch 203/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0261 - acc: 0.9926 - val_loss: 0.0687 - val_acc: 0.9722 
Epoch 204/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0265 - acc: 0.9910 - val_loss: 0.0824 - val_acc: 0.9684 Epoch 205/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0279 - acc: 0.9901 - val_loss: 0.0745 - val_acc: 0.9713 Epoch 206/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0258 - acc: 0.9922 - val_loss: 0.0766 - val_acc: 0.9713 Epoch 207/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0254 - acc: 0.9914 - val_loss: 0.0964 - val_acc: 0.9684 Epoch 208/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0262 - acc: 0.9922 - val_loss: 0.0738 - val_acc: 0.9703 Epoch 209/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0258 - acc: 0.9918 - val_loss: 0.0778 - val_acc: 0.9693 Epoch 210/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0251 - acc: 0.9930 - val_loss: 0.0765 - val_acc: 0.9713 Epoch 211/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0249 - acc: 0.9922 - val_loss: 0.0795 - val_acc: 0.9684 Epoch 212/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0270 - acc: 0.9918 - val_loss: 0.0798 - val_acc: 0.9703 Epoch 213/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0268 - acc: 0.9914 - val_loss: 0.0745 - val_acc: 0.9713 Epoch 214/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0276 - acc: 0.9910 - val_loss: 0.0710 - val_acc: 0.9722 Epoch 215/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0256 - acc: 0.9905 - val_loss: 0.0799 - val_acc: 0.9684 Epoch 216/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0261 - acc: 0.9930 - val_loss: 0.0668 - val_acc: 0.9751 Epoch 217/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0249 - acc: 0.9926 - val_loss: 0.0821 - val_acc: 0.9703 Epoch 218/600 2433/2433 [==============================] - 0s 
35us/step - loss: 0.0251 - acc: 0.9918 - val_loss: 0.0722 - val_acc: 0.9722 Epoch 219/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0255 - acc: 0.9930 - val_loss: 0.0764 - val_acc: 0.9693 Epoch 220/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0258 - acc: 0.9922 - val_loss: 0.0905 - val_acc: 0.9693 Epoch 221/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0257 - acc: 0.9922 - val_loss: 0.0709 - val_acc: 0.9751 Epoch 222/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0256 - acc: 0.9922 - val_loss: 0.0817 - val_acc: 0.9684 Epoch 223/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0296 - acc: 0.9910 - val_loss: 0.0765 - val_acc: 0.9713 Epoch 224/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0252 - acc: 0.9918 - val_loss: 0.0700 - val_acc: 0.9751 Epoch 225/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0245 - acc: 0.9926 - val_loss: 0.0799 - val_acc: 0.9684 Epoch 226/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0246 - acc: 0.9918 - val_loss: 0.0770 - val_acc: 0.9722 Epoch 227/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0243 - acc: 0.9918 - val_loss: 0.0741 - val_acc: 0.9722 Epoch 228/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0239 - acc: 0.9922 - val_loss: 0.0830 - val_acc: 0.9674 Epoch 229/600 2433/2433 [==============================] - 0s 38us/step - loss: 0.0236 - acc: 0.9922 - val_loss: 0.0785 - val_acc: 0.9732 Epoch 230/600 2433/2433 [==============================] - 0s 39us/step - loss: 0.0238 - acc: 0.9922 - val_loss: 0.0753 - val_acc: 0.9713 Epoch 231/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0239 - acc: 0.9926 - val_loss: 0.0734 - val_acc: 0.9751 Epoch 232/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0239 - acc: 0.9922 - val_loss: 0.0796 - 
val_acc: 0.9703 Epoch 233/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0231 - acc: 0.9930 - val_loss: 0.0734 - val_acc: 0.9722 Epoch 234/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0248 - acc: 0.9922 - val_loss: 0.0763 - val_acc: 0.9722 Epoch 235/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0241 - acc: 0.9922 - val_loss: 0.0880 - val_acc: 0.9684 Epoch 236/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0246 - acc: 0.9918 - val_loss: 0.0881 - val_acc: 0.9693 Epoch 237/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0242 - acc: 0.9926 - val_loss: 0.0774 - val_acc: 0.9713 Epoch 238/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0247 - acc: 0.9914 - val_loss: 0.0750 - val_acc: 0.9732 Epoch 239/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0233 - acc: 0.9926 - val_loss: 0.0829 - val_acc: 0.9684 Epoch 240/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0241 - acc: 0.9918 - val_loss: 0.0800 - val_acc: 0.9703 Epoch 241/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0228 - acc: 0.9922 - val_loss: 0.0780 - val_acc: 0.9703 Epoch 242/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0237 - acc: 0.9918 - val_loss: 0.0930 - val_acc: 0.9674 Epoch 243/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0249 - acc: 0.9926 - val_loss: 0.0779 - val_acc: 0.9713 Epoch 244/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0241 - acc: 0.9910 - val_loss: 0.0835 - val_acc: 0.9665 Epoch 245/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0240 - acc: 0.9918 - val_loss: 0.0874 - val_acc: 0.9674 Epoch 246/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0240 - acc: 0.9926 - val_loss: 0.0930 - val_acc: 0.9684 Epoch 247/600 2433/2433 
[==============================] - 0s 32us/step - loss: 0.0242 - acc: 0.9922 - val_loss: 0.0768 - val_acc: 0.9722 Epoch 248/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0479 - acc: 0.9844 - val_loss: 0.0835 - val_acc: 0.9665 Epoch 249/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0238 - acc: 0.9926 - val_loss: 0.0765 - val_acc: 0.9722 Epoch 250/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0233 - acc: 0.9922 - val_loss: 0.0832 - val_acc: 0.9674 Epoch 251/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0227 - acc: 0.9934 - val_loss: 0.0780 - val_acc: 0.9722 Epoch 252/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0228 - acc: 0.9930 - val_loss: 0.0824 - val_acc: 0.9693 Epoch 253/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0223 - acc: 0.9926 - val_loss: 0.0742 - val_acc: 0.9741 Epoch 254/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0248 - acc: 0.9922 - val_loss: 0.0761 - val_acc: 0.9741 Epoch 255/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0231 - acc: 0.9914 - val_loss: 0.0775 - val_acc: 0.9713 Epoch 256/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0228 - acc: 0.9922 - val_loss: 0.0958 - val_acc: 0.9684 Epoch 257/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0228 - acc: 0.9926 - val_loss: 0.0788 - val_acc: 0.9703 Epoch 258/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0236 - acc: 0.9914 - val_loss: 0.0979 - val_acc: 0.9693 Epoch 259/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0220 - acc: 0.9934 - val_loss: 0.0858 - val_acc: 0.9693 Epoch 260/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0223 - acc: 0.9926 - val_loss: 0.0789 - val_acc: 0.9741 Epoch 261/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0217 - 
acc: 0.9922 - val_loss: 0.0983 - val_acc: 0.9693 Epoch 262/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0234 - acc: 0.9910 - val_loss: 0.0813 - val_acc: 0.9732 Epoch 263/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0319 - acc: 0.9877 - val_loss: 0.0870 - val_acc: 0.9713 Epoch 264/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0234 - acc: 0.9905 - val_loss: 0.0904 - val_acc: 0.9693 Epoch 265/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0214 - acc: 0.9922 - val_loss: 0.0801 - val_acc: 0.9703 Epoch 266/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0214 - acc: 0.9934 - val_loss: 0.0915 - val_acc: 0.9684 Epoch 267/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0218 - acc: 0.9934 - val_loss: 0.0994 - val_acc: 0.9674 Epoch 268/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0250 - acc: 0.9914 - val_loss: 0.0910 - val_acc: 0.9684 Epoch 269/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0214 - acc: 0.9930 - val_loss: 0.0866 - val_acc: 0.9703 Epoch 270/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0211 - acc: 0.9930 - val_loss: 0.0884 - val_acc: 0.9684 Epoch 271/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0214 - acc: 0.9930 - val_loss: 0.0843 - val_acc: 0.9693 Epoch 272/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0204 - acc: 0.9930 - val_loss: 0.0838 - val_acc: 0.9713 Epoch 273/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0197 - acc: 0.9934 - val_loss: 0.0771 - val_acc: 0.9761 Epoch 274/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0215 - acc: 0.9910 - val_loss: 0.0848 - val_acc: 0.9693 Epoch 275/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0206 - acc: 0.9934 - val_loss: 0.0859 - val_acc: 0.9693 Epoch 276/600 
2433/2433 [==============================] - 0s 33us/step - loss: 0.0197 - acc: 0.9938 - val_loss: 0.0844 - val_acc: 0.9713 Epoch 277/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0209 - acc: 0.9930 - val_loss: 0.0956 - val_acc: 0.9674 Epoch 278/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0207 - acc: 0.9926 - val_loss: 0.0910 - val_acc: 0.9674 Epoch 279/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0214 - acc: 0.9934 - val_loss: 0.0873 - val_acc: 0.9722 Epoch 280/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0205 - acc: 0.9934 - val_loss: 0.0802 - val_acc: 0.9751 Epoch 281/600 2433/2433 [==============================] - 0s 32us/step - loss: 0.0248 - acc: 0.9901 - val_loss: 0.0851 - val_acc: 0.9722 Epoch 282/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0198 - acc: 0.9930 - val_loss: 0.0841 - val_acc: 0.9722 Epoch 283/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0281 - acc: 0.9893 - val_loss: 0.0853 - val_acc: 0.9713 Epoch 284/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0199 - acc: 0.9930 - val_loss: 0.0810 - val_acc: 0.9741 Epoch 285/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0219 - acc: 0.9918 - val_loss: 0.0783 - val_acc: 0.9751 Epoch 286/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0203 - acc: 0.9942 - val_loss: 0.1042 - val_acc: 0.9665 Epoch 287/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0196 - acc: 0.9926 - val_loss: 0.0824 - val_acc: 0.9732 Epoch 288/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0206 - acc: 0.9942 - val_loss: 0.0880 - val_acc: 0.9693 Epoch 289/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0194 - acc: 0.9938 - val_loss: 0.0939 - val_acc: 0.9665 Epoch 290/600 2433/2433 [==============================] - 0s 34us/step - loss: 
0.0205 - acc: 0.9938 - val_loss: 0.0839 - val_acc: 0.9722 Epoch 291/600 2433/2433 [==============================] - 0s 36us/step - loss: 0.0200 - acc: 0.9926 - val_loss: 0.0834 - val_acc: 0.9722 Epoch 292/600 2433/2433 [==============================] - 0s 36us/step - loss: 0.0186 - acc: 0.9942 - val_loss: 0.0826 - val_acc: 0.9741 Epoch 293/600 2433/2433 [==============================] - 0s 37us/step - loss: 0.0190 - acc: 0.9942 - val_loss: 0.0941 - val_acc: 0.9684 Epoch 294/600 2433/2433 [==============================] - 0s 36us/step - loss: 0.0201 - acc: 0.9930 - val_loss: 0.0871 - val_acc: 0.9713 Epoch 295/600 2433/2433 [==============================] - 0s 37us/step - loss: 0.0210 - acc: 0.9918 - val_loss: 0.0859 - val_acc: 0.9751 Epoch 296/600 2433/2433 [==============================] - 0s 36us/step - loss: 0.0211 - acc: 0.9922 - val_loss: 0.0952 - val_acc: 0.9674 Epoch 297/600 2433/2433 [==============================] - 0s 36us/step - loss: 0.0189 - acc: 0.9938 - val_loss: 0.0846 - val_acc: 0.9732 Epoch 298/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0193 - acc: 0.9938 - val_loss: 0.0966 - val_acc: 0.9665 Epoch 299/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0191 - acc: 0.9926 - val_loss: 0.1093 - val_acc: 0.9674 Epoch 300/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0176 - acc: 0.9942 - val_loss: 0.0881 - val_acc: 0.9703 Epoch 301/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0182 - acc: 0.9938 - val_loss: 0.0999 - val_acc: 0.9665 Epoch 302/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0196 - acc: 0.9918 - val_loss: 0.0921 - val_acc: 0.9693 Epoch 303/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0194 - acc: 0.9934 - val_loss: 0.0842 - val_acc: 0.9751 Epoch 304/600 2433/2433 [==============================] - 0s 33us/step - loss: 0.0194 - acc: 0.9934 - val_loss: 0.1005 - val_acc: 0.9684 Epoch 
[Training log for epochs 305–579 abridged. Over this span, training loss falls from roughly 0.019 to 0.005 and training accuracy climbs from about 0.993 to 0.999, while validation accuracy plateaus around 0.96–0.97 and validation loss drifts upward from about 0.09 to 0.16. The widening gap between training and validation metrics indicates the network begins to overfit in these later epochs.]
loss: 0.0058 - acc: 0.9984 - val_loss: 0.1518 - val_acc: 0.9741 Epoch 580/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0053 - acc: 0.9979 - val_loss: 0.1600 - val_acc: 0.9665 Epoch 581/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0045 - acc: 0.9988 - val_loss: 0.1503 - val_acc: 0.9684 Epoch 582/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0045 - acc: 0.9984 - val_loss: 0.1567 - val_acc: 0.9674 Epoch 583/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0093 - acc: 0.9979 - val_loss: 0.1710 - val_acc: 0.9665 Epoch 584/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0066 - acc: 0.9975 - val_loss: 0.1614 - val_acc: 0.9665 Epoch 585/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0054 - acc: 0.9992 - val_loss: 0.1531 - val_acc: 0.9693 Epoch 586/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0054 - acc: 0.9992 - val_loss: 0.1645 - val_acc: 0.9665 Epoch 587/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0049 - acc: 0.9992 - val_loss: 0.1560 - val_acc: 0.9674 Epoch 588/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0051 - acc: 0.9992 - val_loss: 0.1541 - val_acc: 0.9693 Epoch 589/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0047 - acc: 0.9992 - val_loss: 0.1519 - val_acc: 0.9722 Epoch 590/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0040 - acc: 0.9992 - val_loss: 0.1511 - val_acc: 0.9693 Epoch 591/600 2433/2433 [==============================] - 0s 40us/step - loss: 0.0057 - acc: 0.9988 - val_loss: 0.1646 - val_acc: 0.9665 Epoch 592/600 2433/2433 [==============================] - 0s 36us/step - loss: 0.0050 - acc: 0.9988 - val_loss: 0.1631 - val_acc: 0.9665 Epoch 593/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0047 - acc: 0.9992 - val_loss: 0.1585 - val_acc: 0.9665 
Epoch 594/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0059 - acc: 0.9979 - val_loss: 0.1693 - val_acc: 0.9646 Epoch 595/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0083 - acc: 0.9963 - val_loss: 0.1673 - val_acc: 0.9665 Epoch 596/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0048 - acc: 0.9984 - val_loss: 0.1738 - val_acc: 0.9646 Epoch 597/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0190 - acc: 0.9947 - val_loss: 0.1608 - val_acc: 0.9646 Epoch 598/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0082 - acc: 0.9975 - val_loss: 0.1797 - val_acc: 0.9674 Epoch 599/600 2433/2433 [==============================] - 0s 34us/step - loss: 0.0055 - acc: 0.9975 - val_loss: 0.1691 - val_acc: 0.9674 Epoch 600/600 2433/2433 [==============================] - 0s 35us/step - loss: 0.0056 - acc: 0.9992 - val_loss: 0.1594 - val_acc: 0.9665
<tensorflow.python.keras.callbacks.History at 0x129bf2610>
# Create a dataframe of losses and accuracy before evaluating the performance on the test set.
losses = pd.DataFrame(model.history.history)
# Use the head() function to display the first five rows.
losses.head()
| | val_loss | val_acc | loss | acc |
|---|---|---|---|---|
| 0 | 0.175707 | 0.969349 | 0.353923 | 0.900534 |
| 1 | 0.158727 | 0.969349 | 0.167742 | 0.967530 |
| 2 | 0.149739 | 0.969349 | 0.156495 | 0.967530 |
| 3 | 0.138485 | 0.969349 | 0.144068 | 0.967530 |
| 4 | 0.125024 | 0.969349 | 0.130668 | 0.967530 |
# Visualize the training and validation accuracy and loss.
losses.plot(figsize = (16,9), title = "Validation and training accuracy & loss plot")
plt.xlabel('Epochs')
plt.ylabel('Loss')
Text(0, 0.5, 'Loss')
The validation and training accuracy & loss plot gives a true picture of how the artificial neural network (ANN) trained. Both the training and validation losses dropped sharply in the early epochs, which is a positive sign for the model; note, however, that the validation loss levels off while the training loss keeps shrinking, a pattern that often indicates mild overfitting.
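The long, fluctuating tail of the training log suggests that most of the 600 epochs add little. A patience-based early-stopping rule (the same idea as Keras's `EarlyStopping` callback) would halt training once `val_loss` stops improving. A minimal sketch of that rule in plain Python; the `curve` sequence below is illustrative, not the actual run:

```python
def early_stop_epoch(val_losses, patience=5, min_delta=0.0):
    """Return the 1-based epoch at which training would stop,
    i.e. after `patience` consecutive epochs with no val_loss improvement."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:  # improvement: remember it, reset the counter
            best = loss
            wait = 0
        else:                        # no improvement: count toward patience
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)           # patience never triggered

# Illustrative val_loss curve: improves, then plateaus.
curve = [0.30, 0.20, 0.15, 0.14, 0.15, 0.16, 0.15, 0.17, 0.16, 0.15]
print(early_stop_epoch(curve, patience=3))  # → 7
```

In Keras this corresponds to passing `callbacks=[EarlyStopping(monitor='val_loss', patience=...)]` to `model.fit`, which would likely have stopped this run long before epoch 600.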
# Evaluate the artificial neural network model on the test dataset.
# Note: predict_classes() was removed in newer Keras versions (TF >= 2.6);
# the equivalent there is (model.predict(X_test5) > 0.5).astype("int32").
new_pred = model.predict_classes(X_test5)
# Show the classification report and the confusion matrix.
print(classification_report(y_test5, new_pred))
print('\n')
# Show the Confusion Matrix.
cnf_matrix = confusion_matrix(y_test5, new_pred)
class_names=[0,1] # name of classes
fig, ax = plt.subplots()
tick_marks = np.arange(len(class_names))
plt.xticks(tick_marks, class_names)
plt.yticks(tick_marks, class_names)
# Use the sns.heatmap() function to display the confusion matrix as a heatmap.
sns.heatmap(pd.DataFrame(cnf_matrix), annot=True,
cmap="icefire_r" ,fmt='g')
ax.xaxis.set_label_position("top")
ax.xaxis.set_ticks_position("top")
plt.title('Confusion matrix', y=1.2)
plt.ylabel('Actual label')
plt.xlabel('Predicted label')
plt.tight_layout()
precision recall f1-score support
0 0.99 0.98 0.98 1012
1 0.46 0.56 0.51 32
accuracy 0.97 1044
macro avg 0.72 0.77 0.74 1044
weighted avg 0.97 0.97 0.97 1044
After carefully analyzing the classification report and the confusion matrix, the artificial neural network is not as strong as the random forest. Although its overall accuracy is high, its precision (0.46) and recall (0.56) on the positive class are poor: it misclassifies a large share of the actual loan purchasers, which makes it an inferior model compared to the other supervised machine learning models.
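The weak precision and recall on class 1 largely reflect class imbalance: only 32 of the 1044 test customers purchased the loan. One common mitigation (not used in this notebook) is to weight the minority class more heavily during training, e.g. via the `class_weight` argument of `model.fit`. A sketch of the standard "balanced" weighting, computed with plain NumPy on illustrative labels:

```python
import numpy as np

def balanced_class_weights(y):
    """Weight each class inversely to its frequency:
    w_c = n_samples / (n_classes * count_c)."""
    classes, counts = np.unique(y, return_counts=True)
    weights = len(y) / (len(classes) * counts)
    return dict(zip(classes.tolist(), weights.tolist()))

# Illustrative labels with a 9:1 imbalance, mirroring the ~90/10
# split of the Personal Loan target in this dataset.
y = np.array([0] * 90 + [1] * 10)
print(balanced_class_weights(y))  # minority class gets ~9x the majority's weight
```

These weights would be passed as `model.fit(..., class_weight=weights)`; `sklearn.utils.class_weight.compute_class_weight('balanced', ...)` produces the same values.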
# Show the accuracy, precision, recall, and F1 score.
ann_acc = metrics.accuracy_score(y_test5, new_pred)*100
print("Accuracy:",round(metrics.accuracy_score(y_test5, new_pred)*100,3),"%.")
print("Precision:",round(metrics.precision_score(y_test5, new_pred)*100,3),"%.")
print("Recall:",round(metrics.recall_score(y_test5, new_pred)*100,3),"%.")
print("F1 Score:",round(metrics.f1_score(y_test5, new_pred)*100,3),"%.")
Accuracy: 96.648 %. Precision: 46.154 %. Recall: 56.25 %. F1 Score: 50.704 %.
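These printed metrics can be verified by hand against the confusion matrix. From the classification report, class 1 has 32 actual positives with recall 56.25% (so TP = 18, FN = 14) and precision 46.154% (so FP = 21), leaving TN = 1012 − 21 = 991. A quick arithmetic check that reproduces the reported figures:

```python
# Confusion-matrix cells reconstructed from the classification report above.
TP, FN, FP, TN = 18, 14, 21, 991

accuracy  = (TP + TN) / (TP + TN + FP + FN)   # correct / total
precision = TP / (TP + FP)                    # of predicted 1s, how many were right
recall    = TP / (TP + FN)                    # of actual 1s, how many were found
f1        = 2 * precision * recall / (precision + recall)

print(round(accuracy * 100, 3))    # → 96.648
print(round(precision * 100, 3))   # → 46.154
print(round(recall * 100, 3))      # → 56.25
print(round(f1 * 100, 3))          # → 50.704
```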
# Compute the error rate of the artificial neural network.
# Note: y_test5 and new_pred must have the same shape here. If new_pred is a
# column vector of shape (n, 1) while y_test5 is 1-D, the != comparison
# broadcasts to an (n, n) matrix and inflates the result; new_pred.ravel()
# avoids this.
ann_err = round(np.mean(y_test5 != new_pred)*100,3)
print("The error rate is",ann_err,"%.")
The error rate is 6.572 %.
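Note that the printed error rate (6.572%) is not 100% − 96.648% = 3.352%, as it should be by definition. A likely culprit is NumPy broadcasting: `predict_classes` returns a column vector of shape `(n, 1)`, and comparing it to 1-D labels produces an `(n, n)` boolean matrix rather than an elementwise result. A small sketch of the pitfall, using illustrative arrays:

```python
import numpy as np

y_true = np.array([0, 0, 1, 1])          # 1-D labels, shape (4,)
y_pred = np.array([[0], [0], [1], [0]])  # column vector, shape (4, 1); one mistake

# Broadcast comparison: (4,) vs (4, 1) silently expands to a (4, 4) matrix.
wrong = np.mean(y_true != y_pred) * 100
# Flattening the predictions gives the true elementwise error rate.
right = np.mean(y_true != y_pred.ravel()) * 100

print(wrong)   # → 50.0  (inflated by broadcasting)
print(right)   # → 25.0  (the actual error rate: 1 of 4 wrong)
```

Re-computing `ann_err` with `new_pred.ravel()` would likely bring it in line with `100 - ann_acc`.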
In contrast to all the other models, the artificial neural network shows by far the highest error rate. (The reported 6.572% also disagrees with 100% minus the 96.648% accuracy above, i.e. 3.352%, which suggests the comparison was made between arrays of mismatched shapes.) Either way, this is not a model the retail marketing department should adopt.
# Create a dictionary which takes all the supervised machine learning algorithms.
dat = {'Logistic': [log_acc, log_err], 'KNN':[knn_acc, knn_err],
'Decision Tree':[decision_acc, decision_err],
'Random Forest':[random_acc, random_err],
'SVM':[svm_acc, svm_err],'ANN':[ann_acc, ann_err]}
# Create a new dataframe to show algorithms.
datframe = pd.DataFrame(dat, index=['Accuracy', 'Error'])
datframe
| | Logistic | KNN | Decision Tree | Random Forest | SVM | ANN |
|---|---|---|---|---|---|---|
| Accuracy | 97.509579 | 97.796935 | 97.701149 | 98.371648 | 97.988506 | 96.64751 |
| Error | 2.490000 | 2.203000 | 2.299000 | 1.628000 | 2.011000 | 6.57200 |
The summary table above shows the accuracy and error rate of every machine learning algorithm deployed in this project. As shown, random forest is the best-performing algorithm.
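Given such a summary table, the best model can be picked programmatically rather than by eye, using `idxmax`/`idxmin` on the relevant rows. A small sketch on a rebuilt copy of the table (the numbers are copied from the output above):

```python
import pandas as pd

# Rebuild the summary table from the values printed above.
dat = {'Logistic': [97.509579, 2.490], 'KNN': [97.796935, 2.203],
       'Decision Tree': [97.701149, 2.299], 'Random Forest': [98.371648, 1.628],
       'SVM': [97.988506, 2.011], 'ANN': [96.64751, 6.572]}
datframe = pd.DataFrame(dat, index=['Accuracy', 'Error'])

best_by_acc = datframe.loc['Accuracy'].idxmax()  # model with the highest accuracy
best_by_err = datframe.loc['Error'].idxmin()     # model with the lowest error rate

print(best_by_acc)   # → Random Forest
print(best_by_err)   # → Random Forest
```

Both criteria agree here; when they differ (e.g. if error is measured on a different split), the tie-break should favor the metric that matches the business goal.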
After a thorough investigation, feature engineering, and exploration of the dataset variables, several of the most prominent supervised machine learning algorithms were deployed in this project. The objective was to find the optimal algorithm for helping the retail marketing department identify customers who are most likely to purchase the loan. After testing these algorithms, I found that random forest is the best fit for this dataset: it achieved the highest accuracy and the lowest error rate, and it most reliably predicted whether potential customers would accept or decline a personal loan. The bank should therefore adopt the random forest model, which will help its retail marketing department cut campaign costs in future campaigns.